2 research outputs found

    putEMG -- a surface electromyography hand gesture recognition dataset

    In this paper, we present the putEMG dataset, intended for the evaluation of hand gesture recognition methods based on the sEMG signal. The dataset was acquired from 44 able-bodied subjects and includes eight gestures (three full hand gestures, four pinches, and idle). It consists of uninterrupted recordings of 24 sEMG channels from the subject's forearm, an RGB video stream, and depth camera images used for hand motion tracking. Exemplary processing scripts are also published. The putEMG dataset is available under the Creative Commons Attribution-NonCommercial 4.0 International (CC BY-NC 4.0) license at: https://www.biolab.put.poznan.pl/putemg-dataset/. The dataset was validated with respect to sEMG amplitudes and gesture recognition performance. Classification was performed using state-of-the-art classifiers and feature sets. An accuracy of 90% was achieved both for an SVM classifier using the RMS feature and for an LDA classifier using Hudgins' and Du's feature sets. Analysis of per-gesture performance showed that the LDA/Du combination achieves significantly higher accuracy for full hand gestures, while SVM/RMS performs better for pinch gestures. The presented dataset can be used as a benchmark for various classification methods, for evaluation of electrode localisation concepts, or for the development of classification methods invariant to user-specific features or electrode displacement.
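    The SVM/RMS pipeline mentioned above can be sketched as follows. This is a minimal illustration, not the authors' published processing scripts: the window/step sizes, SVM kernel, and the synthetic two-class signals are all assumptions standing in for real putEMG recordings.

    ```python
    import numpy as np
    from sklearn.svm import SVC

    def rms_features(emg, window=256, step=128):
        """Windowed per-channel RMS; emg has shape (samples, channels).
        Window and step lengths here are illustrative, not from the paper."""
        feats = []
        for start in range(0, emg.shape[0] - window + 1, step):
            seg = emg[start:start + window]
            feats.append(np.sqrt(np.mean(seg ** 2, axis=0)))
        return np.array(feats)

    # Synthetic stand-in for two gesture classes on 24 channels; real
    # experiments would load the putEMG recordings instead.
    rng = np.random.default_rng(0)
    rest = rng.normal(scale=1.0, size=(5120, 24))   # low-amplitude "idle"
    grip = rng.normal(scale=3.0, size=(5120, 24))   # higher-amplitude gesture

    X = np.vstack([rms_features(rest), rms_features(grip)])
    y = np.array([0] * 39 + [1] * 39)               # 39 windows per recording

    clf = SVC(kernel="rbf").fit(X, y)
    ```

    Because RMS amplitude is the discriminating quantity, even this toy classifier separates the two synthetic classes cleanly; on real sEMG, overlapping amplitudes across gestures are what make the richer Hudgins/Du feature sets worthwhile.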

    PUT-Hand—Hybrid Industrial and Biomimetic Gripper for Elastic Object Manipulation

    In this article, we present the design of a five-fingered anthropomorphic gripper intended specifically for the manipulation of elastic objects. The manipulator features a hybrid design: three fully actuated fingers for precise manipulation and two underactuated, tendon-driven digits for secure power grasping. For ease of reproducibility, the design uses as many off-the-shelf and 3D-printed components as possible. The on-board controller circuit and firmware are also presented. The design includes resistive position and angle sensors in each joint, providing full joint observability. The controller integrates position-based control along with a USB communication protocol, enabling gripper state reporting and direct motor control from a PC. A high-level driver operating as a Robot Operating System node is also provided. All drives and circuitry of the PUT-Hand are integrated within the hand itself. The sensory system includes tri-axial optical force sensors on the fingertips of the fully actuated fingers for reaction force measurement. A set of experiments demonstrates the motion and perception capabilities of the gripper. All design files and source code are available online under CC BY-NC 4.0 and MIT licenses.
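    The position-based control described above can be illustrated with a single proportional control step per loop iteration, clipped to a per-step travel limit. This is a hypothetical sketch, not the PUT-Hand firmware: the function name, gain `kp`, and limit `max_step` are all assumed values.

    ```python
    def position_step(current, target, kp=0.8, max_step=0.05):
        """One control-loop iteration: proportional command toward the
        setpoint, clipped to a per-step travel limit (illustrative gains)."""
        command = kp * (target - current)
        command = max(-max_step, min(max_step, command))
        return current + command

    # Drive a joint from 0.0 rad toward a 1.0 rad setpoint.
    pos = 0.0
    for _ in range(200):
        pos = position_step(pos, 1.0)
    ```

    The clipping keeps joint velocity bounded far from the setpoint, while the proportional term gives smooth convergence near it; a real firmware loop would additionally read the resistive joint sensors each iteration and apply the command to the motor driver.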